Description of the Minimizers of Least Squares Regularized with ℓ0-norm. Uniqueness of the Global Minimizer
Author
Abstract
We have an arbitrary real-valued M × N matrix A (e.g. a dictionary) with M < N and data d describing the sought-after object with the help of A. This work provides an in-depth analysis of the (local and global) minimizers of an objective function Fd combining a quadratic data-fidelity term and an l0 penalty applied to each entry of the sought-after solution, weighted by a regularization parameter β > 0. For several decades, this objective has attracted a ceaseless effort to conceive algorithms approaching a good minimizer. Our theoretical contributions, summarized below, shed new light on the existing algorithms and can aid the design of innovative numerical schemes. Solving the normal equation associated with any M-row submatrix of A is equivalent to computing a local minimizer û of Fd. (Local) minimizers û of Fd are strict if and only if the submatrix composed of those columns of A whose indexes form the support of û has full column rank. An outcome is that strict local minimizers of Fd are easily computed without knowing the value of β. Each strict local minimizer is a linear function of the data. It is proved that Fd has global minimizers and that they are always strict. They are studied in more detail under the (standard) assumption that rank(A) = M < N. The global minimizers with M-length support are seen to be impractical. Given d, critical values βK for any K ≤ M − 1 are exhibited such that if β > βK, all global minimizers of Fd are K-sparse. An assumption on A is adopted and proved to fail only on a closed negligible subset. Then, for all data d outside a closed negligible subset, the objective Fd with β > βK, K ≤ M − 1, has a unique global minimizer, and this minimizer is K-sparse. Instructive small-size (5 × 10) numerical illustrations confirm the main theoretical results.
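To make the normal-equation characterization above concrete, here is a minimal NumPy sketch, not taken from the paper: it restricts the least-squares problem to a given support, solves the corresponding normal equation, and checks the full-column-rank condition under which the resulting point is a strict local minimizer of Fd. The function name local_minimizer_on_support and the random 5 × 10 instance are illustrative assumptions; as stated in the abstract, the value of β plays no role in computing the point itself.

```python
import numpy as np

def local_minimizer_on_support(A, d, support):
    """Solve the normal equation of the submatrix A_S (columns of A indexed
    by `support`) and embed the solution back into R^N.

    Returns a candidate local minimizer u of
        F_d(u) = ||A u - d||^2 + beta * ||u||_0,  with supp(u) contained in `support`,
    together with a flag telling whether A_S has full column rank,
    i.e. whether u is a strict (local) minimizer.
    """
    A_S = A[:, support]                                  # M x |S| submatrix
    x_S, *_ = np.linalg.lstsq(A_S, d, rcond=None)        # normal-equation / least-squares solution
    u = np.zeros(A.shape[1])
    u[support] = x_S
    strict = np.linalg.matrix_rank(A_S) == len(support)  # full column rank?
    return u, strict

# Toy 5 x 10 instance, echoing the small-size illustrations mentioned above
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 10))
d = rng.standard_normal(5)
u, strict = local_minimizer_on_support(A, d, [1, 4, 7])
print(np.nonzero(u)[0], strict)   # support of u and whether it is strict
```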
Similar resources
[Apr 2013] Description of the minimizers of least squares regularized with l0-norm. Uniqueness of the global minimizer
We have an M × N real-valued arbitrary matrix A (e.g. a dictionary) with M < N and data d describing the sought-after object with the help of A. This work provides an in-depth analysis of the (local and global) minimizers of an objective function Fd combining a quadratic data-fidelity term and an l0 penalty applied to each entry of the sought-after solution, weighted by a regularization paramete...
Description of the minimizers of least squares regularized with l0-norm. Uniqueness of the global minimizer
We have an M × N real-valued arbitrary matrix A (e.g. a dictionary) with M < N and data d describing the sought-after object with the help of A. This work provides an in-depth analysis of the (local and global) minimizers of an objective function Fd combining a quadratic data-fidelity term and an l0 penalty applied to each entry of the sought-after solution, weighted by a regularization paramet...
Stability of Minimizers of Regularized Least Squares Objective Functions II: Study of the Global Behavior
We address estimation problems where the sought-after solution is defined as the minimizer of an objective function composed of a quadratic data-fidelity term and a regularization term. We especially focus on nonsmooth and/or nonconvex regularization terms because of their ability to yield good estimates. This work is dedicated to the stability of the minimizers of such nonsmooth and/or nonconv...
Global least squares solution of matrix equation $\sum_{j=1}^s A_jX_jB_j = E$
In this paper, an iterative method is proposed for solving matrix equation $\sum_{j=1}^s A_jX_jB_j = E$. This method is based on the global least squares (GL-LSQR) method for solving the linear system of equations with the multiple right-hand sides. For applying the GL-LSQR algorithm to solve the above matrix equation, a new linear operator, its adjoint and a new inner product are defined. It is p...
Relationship between the optimal solutions of least squares regularized with L0-norm and constrained by k-sparsity
Two widely used models to find a sparse solution from a noisy underdetermined linear system are the constrained problem where the quadratic error is minimized subject to a sparsity constraint, and the regularized problem where a regularization parameter balances the minimization of both quadratic error and sparsity. However, the connections between these two problems have remained unclear so fa...
Journal: SIAM J. Imaging Sciences
Volume 6, Issue -
Pages: -
Publication date: 2013